The '''layered hidden Markov model''' ('''LHMM''') is a statistical model derived from the hidden Markov model (HMM). An LHMM consists of ''N'' levels of HMMs, where the HMMs on level ''i'' + 1 correspond to observation symbols or probability generators at level ''i''. Every level ''i'' of the LHMM consists of ''K''<sub>''i''</sub> HMMs running in parallel.〔N. Oliver, A. Garg and E. Horvitz, "Layered Representations for Learning and Inferring Office Activity from Multiple Sensory Channels", ''Computer Vision and Image Understanding'', vol. 96, pp. 163–180, 2004.〕

== Background ==
LHMMs are useful because the layered structure can facilitate learning and generalization. For example, even though a fully connected HMM could always be used if enough training data were available, it is often beneficial to constrain the model by disallowing arbitrary state transitions. In the same way, it can be beneficial to embed the HMM in a layered structure which, in theory, cannot solve any problem the basic HMM cannot, but which can solve some problems more efficiently because less training data is needed.
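To make the layered arrangement concrete, the following minimal Python sketch runs two low-level HMMs in parallel over windows of a raw symbol stream; the index of the best-scoring low-level model serves as the observation symbol for a single higher-level HMM. All names and parameters here (<code>forward_log_likelihood</code>, <code>random_hmm</code>, the window length, and the randomly generated toy HMMs) are illustrative assumptions rather than part of the published model; in practice each HMM would be trained, for example with the Baum–Welch algorithm.

```python
import numpy as np

def forward_log_likelihood(pi, A, B, obs):
    """Log-likelihood of a discrete observation sequence under an HMM
    via the scaled forward algorithm (pi: initial state probabilities,
    A: state transition matrix, B: emission matrix)."""
    alpha = pi * B[:, obs[0]]
    ll = 0.0
    for t in obs[1:]:
        scale = alpha.sum()
        ll += np.log(scale)
        alpha = (alpha / scale) @ A * B[:, t]
    return ll + np.log(alpha.sum())

rng = np.random.default_rng(0)

def random_hmm(n_states=2, n_symbols=3):
    # Hypothetical toy parameters: random stochastic vectors/matrices.
    # A real LHMM would use HMMs trained on labeled data instead.
    pi = rng.dirichlet(np.ones(n_states))
    A = rng.dirichlet(np.ones(n_states), size=n_states)
    B = rng.dirichlet(np.ones(n_symbols), size=n_states)
    return pi, A, B

# Lower level: K = 2 HMMs running in parallel over the raw symbols.
low_level_hmms = [random_hmm() for _ in range(2)]

def classify_window(window):
    """Each low-level HMM scores the window; the index of the winning
    model becomes the observation symbol passed up to the next level."""
    scores = [forward_log_likelihood(pi, A, B, window)
              for pi, A, B in low_level_hmms]
    return int(np.argmax(scores))

# Upper level: one HMM whose alphabet is the set of low-level model indices.
pi1, A1, B1 = random_hmm(n_states=2, n_symbols=len(low_level_hmms))

raw = rng.integers(0, 3, size=40)      # raw observation stream
windows = raw.reshape(-1, 10)          # non-overlapping windows of length 10
upper_obs = [classify_window(w) for w in windows]
print("upper-level observation sequence:", upper_obs)
print("upper-level log-likelihood:",
      forward_log_likelihood(pi1, A1, B1, upper_obs))
```

Passing only the winning model index upward is the simplest interface between levels; a softer alternative suggested by the definition above is to forward the per-model likelihoods themselves, so that each lower-level HMM acts as a probability generator for the level above.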